Minimax Optimal Additive Functional Estimation with Discrete Distribution: Slow Divergence Speed Case

Authors

  • Kazuto Fukuchi
  • Jun Sakuma
Abstract

This paper addresses the problem of estimating an additive functional of φ, defined as θ(P; φ) = ∑_{i=1}^{k} φ(p_i), given n i.i.d. random samples drawn from a discrete distribution P = (p_1, ..., p_k) with alphabet size k. In our previous paper [1], we revealed that the minimax optimal rate of this problem is characterized by the divergence speed of the fourth derivative of φ in a range of fast divergence speeds. In this paper, we prove this fact for a more general range of divergence speeds. As a result, we show the minimax optimal rate of additive functional estimation for each range of the parameter α of the divergence speed. For α ∈ (1, 3/2), we show that the minimax rate is 1/n + k²/(n ln n)^{2α}. Moreover, we show that the minimax rate is 1/n for α ∈ [3/2, 2].
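The paper's minimax-optimal estimator is more involved (it relies on best-polynomial-approximation techniques); for illustration only, the sketch below shows the naive plug-in (MLE) estimator of the additive functional, which substitutes empirical frequencies for the unknown probabilities p_i. The function name `plug_in_additive_functional` is hypothetical, not from the paper.

```python
from collections import Counter

def plug_in_additive_functional(samples, phi):
    # Naive plug-in estimate of theta(P; phi) = sum_i phi(p_i):
    # replace each unknown p_i with its empirical frequency c_i / n.
    # Unseen symbols contribute nothing, which matches phi(0) = 0
    # for functionals like phi(p) = p**alpha with alpha > 0.
    n = len(samples)
    counts = Counter(samples)
    return sum(phi(c / n) for c in counts.values())

# Example with phi(p) = p**1.2, i.e. alpha = 1.2 in the slow range (1, 3/2)
samples = [0, 1, 1, 2, 2, 2, 3, 3]
estimate = plug_in_additive_functional(samples, lambda p: p ** 1.2)
```

This plug-in estimator is known to be suboptimal in the large-alphabet regime the paper studies, which is precisely what motivates the refined estimators whose rates are stated above.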


Related articles

Minimax Estimator of a Lower Bounded Parameter of a Discrete Distribution under a Squared Log Error Loss Function

The problem of estimating the parameter θ, when it is restricted to a lower-bounded interval, is considered for a class of discrete distributions including the Binomial, Negative Binomial, and discrete Weibull. We give necessary and sufficient conditions under which the Bayes estimator of θ with respect to a two-point boundary-supported prior is minimax under the squared log error loss function....

Minimax Estimation of KL Divergence between Discrete Distributions

We refine the general methodology in [1] for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite dimensional parameters, and elaborate on the case of discrete distributions with support size S comparable with the number of observations n. Specifically, we determine the “smooth” and “non-smooth” regimes based on the confidence set and the smo...

Minimax estimation of the L1 distance

We consider the problem of estimating the L1 distance between two discrete probability measures P and Q from empirical data in a nonasymptotic and large alphabet setting. When Q is known and one obtains n samples from P, we show that for every Q, the minimax rate-optimal estimator with n samples achieves performance comparable to that of the maximum likelihood estimator (MLE) with n ln n sample...

Minimax Estimation of Discrete Distributions under $\ell_1$ Loss

We consider the problem of discrete distribution estimation under ℓ1 loss. We provide tight upper and lower bounds on the maximum risk of the empirical distribution (the maximum likelihood estimator), and the minimax risk in regimes where the support size S may grow with the number of observations n. We show that among distributions with bounded entropy H, the asymptotic maximum risk for the e...

Minimax Estimation of the Scale Parameter in a Family of Transformed Chi-Square Distributions under Asymmetric Squared Log Error and MLINEX Loss Functions

This paper is concerned with the problem of finding the minimax estimators of the scale parameter θ in a family of transformed chi-square distributions, under the asymmetric squared log error (SLE) and modified linear exponential (MLINEX) loss functions, using the Lehmann Theorem [2]. We also show that the results of Podder et al. [4] for the Pareto distribution are a special case of our results for th...


Journal:
  • CoRR

Volume: abs/1801.05362  Issue: -

Pages: -

Publication date: 2018